Both theoretical and empirical evidence suggests that, in many markets with standards competition, network effects make the strong grow stronger and can "tip" the market toward a single, winner-take-all standard. We hypothesize, however, that low-cost digital conversion technologies, which facilitate easy compatibility across competing standards, may reduce the strength of these network effects. We empirically test our hypotheses in the context of the digital flash memory card market. We first test for the presence of network effects in this market and find that network effects, as measured here, are associated with a significant positive price premium for leading flash memory card formats. We then find that the availability of digital converters reduces the price premium of the leading flash card formats and reduces the overall concentration in the flash memory market. Thus, our results suggest that, in the presence of low-cost conversion technologies and digital content, the probability of market dominance can be lessened to the point where multiple, otherwise incompatible, standards are viable. Our conclusion that the presence of converters weakens network effects implies that producers of non-dominant digital goods standards benefit from the provision of conversion technology. Our analysis thus aids managers seeking to understand the impact of converters on market outcomes, and contributes to the existing literature on network effects by providing new insights into how conversion technologies can affect pricing strategies in these increasingly important digital settings.
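To make the empirical logic concrete, the sketch below shows the kind of hedonic price regression such a test implies, with an interaction term capturing whether converter availability attenuates the network-effect premium. The simulated data, variable names (installed_base, converter_avail, capacity_mb), and coefficients are illustrative assumptions, not the authors' actual specification or results.

```python
# Hypothetical sketch, not the paper's actual model: regress log price on a
# format's installed base, converter availability, and their interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "installed_base": rng.uniform(0, 1, n),    # format's share of installed base
    "converter_avail": rng.integers(0, 2, n),  # 1 if a converter exists for the format
    "capacity_mb": rng.choice([64, 128, 256, 512], n),
})
# Simulated log price: a base-driven premium that weakens when converters exist.
df["log_price"] = (
    2.0
    + 0.004 * df.capacity_mb
    + 0.50 * df.installed_base                       # network-effect premium
    - 0.30 * df.installed_base * df.converter_avail  # converters attenuate it
    + rng.normal(0, 0.1, n)
)

model = smf.ols("log_price ~ installed_base * converter_avail + capacity_mb",
                data=df).fit()
print(model.summary())
```

In this framing, a negative coefficient on the installed_base:converter_avail interaction corresponds to the paper's finding that converters reduce the leading formats' price premium.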
In markets that exhibit network effects, the presence of digital conversion technologies provides an alternative mechanism to achieve compatibility. This study examines the impact of conversion technologies on market equilibrium in the context of sequential duopoly competition and proprietary technology standards. We analyze this question by departing from the extant literature to endogenize the decision to provide a converter and incorporate explicit negotiations between firms concerning the extent of conversion. We argue that these choices better reflect the environment facing firms in digital goods industries and find that these decisions change some of the established results in the literature. Specifically, we find that unless network effects are very large, the subgame-perfect Nash equilibrium (SPNE) involves firms' agreeing to provide digital converters at a sufficiently low price to all consumers. At this equilibrium, both the entrant and the incumbent are better off because the provision of converters alleviates price competition in the market and leads to both higher product revenues and higher proceeds from the sale of converters. Moreover, under some circumstances, the provision of converters is welfare enhancing. These findings have important implications for research and practice in the adoption of new digital goods, as the introduction of conversion technologies can reduce the social costs of standardization without compromising the benefits of network effects.
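A stylized utility specification helps fix ideas about where a negotiated degree of conversion enters such a model; the form below is a common textbook device, assumed here for illustration rather than taken from the paper.

```latex
% Illustrative consumer utility with partial conversion (assumed form, not
% the paper's exact model): a buyer of standard i values the rival network
% n_j at the negotiated conversion rate gamma.
u_i = v + \theta\,(n_i + \gamma\, n_j) - p_i, \qquad \gamma \in [0, 1]
```

Here θ is the strength of network effects and γ = 0 corresponds to full incompatibility. Raising γ lets each firm's customers capture part of the rival's network benefits, which blunts the winner-take-all size race and, as the abstract argues, softens price competition.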
The article traces the development of, and changes in, the editorship of the journal Information Systems Research (ISR) in the U.S. E. Burton Swanson was appointed the journal's first editor-in-chief in 1987, and his editorial policy and accomplishments are highlighted. The editorship passed to John Leslie King in 1992, who addressed two major issues: attracting submissions from top researchers in the field and raising the quality of the work being published. The editorship then passed to Izak Benbasat in 1999, who established a Senior Editor board.
In this paper we develop a learning-mediated model of offshore software project productivity and quality to examine whether widely adopted structured software processes are effective in mitigating the negative effects of work dispersion in offshore software development. We explicate how the key process areas of the Capability Maturity Model (CMM) can be utilized as a platform to launch learning routines in offshore software development and thereby explain why some offshore software development process improvement initiatives are more effective than others. We validate our learning-mediated model of offshore software project performance by utilizing data collected from 42 offshore software projects of a large firm that operates at CMM level-5 process maturity. Our results indicate that investments in structured processes mitigate the negative effects of work dispersion in offshore software development. We also find that the effect of software process improvement initiatives is mediated through investments in process-based learning activities. These results imply that investments in structured processes and the corresponding process-based learning activities can be an economically viable way to counter the challenges of work dispersion and improve offshore project performance. We discuss the implications of these results for the adoption of normative process models by offshore software firms.
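The mediation claim, that structured-process investment improves performance through process-based learning, follows a standard three-regression logic; the sketch below illustrates it on simulated data with hypothetical variable names, not the study's 42-project dataset.

```python
# Baron-Kenny style mediation sketch on simulated data; all names and
# coefficients are illustrative assumptions, not the study's estimates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
process_investment = rng.normal(0, 1, n)
learning = 0.6 * process_investment + rng.normal(0, 1, n)  # mediator
quality = 0.5 * learning + 0.1 * process_investment + rng.normal(0, 1, n)
df = pd.DataFrame({"process_investment": process_investment,
                   "learning": learning, "quality": quality})

total = smf.ols("quality ~ process_investment", df).fit()    # total effect
a_path = smf.ols("learning ~ process_investment", df).fit()  # X -> mediator
direct = smf.ols("quality ~ process_investment + learning", df).fit()

print("total effect: ", round(total.params["process_investment"], 3))
print("a path:       ", round(a_path.params["process_investment"], 3))
print("b path:       ", round(direct.params["learning"], 3))
print("direct effect:", round(direct.params["process_investment"], 3))
# A much smaller direct effect alongside significant a and b paths is the
# classic signature of mediation.
```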
Innovation researchers have known for some time that a new information technology may be widely acquired, but then only sparsely deployed among acquiring firms. When this happens, the observed pattern of cumulative adoptions will vary depending on which event in the assimilation process (i.e., acquisition or deployment) is treated as the adoption event. Instead of mirroring one another, a widening gap--termed here an assimilation gap--will exist between the cumulative adoption curves associated with the alternatively conceived adoption events. When a pronounced assimilation gap exists, the common practice of using cumulative purchases or acquisitions as the basis for diffusion modeling can present an illusory picture of the diffusion process--leading to potentially erroneous judgments about the robustness of the diffusion process already observed, and of the technology's future prospects. Researchers may draw inappropriate theoretical inferences about the forces driving diffusion. Practitioners may commit to a technology based on a belief that pervasive adoption is inevitable, when it is not. This study introduces the assimilation gap concept, and develops a general operational measure derived from the difference between the cumulative acquisition and deployment patterns. It describes how two characteristics--increasing returns to adoption and knowledge barriers impeding adoption--separately and in combination may serve to predispose a technology to exhibit a pronounced gap. It develops techniques for measuring assimilation gaps, for establishing whether two gaps are significantly different from each other, and for establishing whether a particular gap is absolutely large enough to be of substantive interest. Finally, it demonstrates these techniques in an analysis of adoption data for three prominent innovations in software process technology--relational database management systems (RDBs), general purpose fourth generation languages (4GLs), and computer aided software engineering (CASE) tools.
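As one simplified reading of the operational measure, an assimilation gap can be summarized by the vertical distance between the cumulative acquisition and deployment curves, or by its area over time. The sketch below uses made-up exponential-style diffusion curves purely for illustration.

```python
# Illustrative assimilation-gap computation on invented diffusion curves;
# the functional forms and rates are assumptions, not the paper's data.
import numpy as np

years = np.arange(1980, 1996)
t = years - years[0]
acquisition = 1 - np.exp(-0.35 * t)  # cumulative share of firms acquiring
deployment = 1 - np.exp(-0.15 * t)   # cumulative share actually deploying

gap_by_year = acquisition - deployment
gap_area = np.trapz(gap_by_year, years)  # summary magnitude (share x years)

for y, a, d in zip(years, acquisition, deployment):
    print(f"{y}: acquired {a:5.1%}  deployed {d:5.1%}  gap {a - d:5.1%}")
print(f"area between curves: {gap_area:.2f} share-years")
```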
Much has been written in recent years about the changes in corporate strategies and industry structures associated with electronic coordination of market activities. This paper considers the advent of electronic market coordination in the home mortgage industry, focusing on Computerized Loan Origination (CLO) systems. Case studies of five CLOs (First Boston's Shelternet, PAC's LoanExpress, American Financial Network's Rennie Mae, Prudential's CLOS, and Citicorp's Mortgage Power Plus) reveal a range of system functionalities. Predictions from the Electronic Markets Hypothesis (EMH) are tested against the empirical results of the five case studies. As suggested by the EMH, financial intermediaries have been threatened by the introduction of CLOs, and in some cases opposition has been mounted against the systems. On the other hand, despite the availability of the technology and mortgages' seemingly favorable characteristics as an electronically mediated market product, the industry has not been fundamentally changed by the introduction of these systems, despite more than a decade of experience with them. Of the two case studies that could be characterized as electronic markets, neither continues to exist in that form today. And the system with the largest dollar volume of mortgages of the five is best characterized as an electronic hierarchy. These results suggest that either the full results predicted by the EMH require a longer gestation period or the underlying hypothesis will require augmentation in order to fully explain the outcomes in the home mortgage market.
The information systems (IS) development activity in large organizations is a source of increasing cost and concern to management. IS development projects are often over-budget, late, costly to maintain, and not done to the satisfaction of the requesting user. These problems exist, in part, due to the organization of the IS development process, where information systems development is typically assigned by the user (principal) to a systems developer (agent). These two parties do not have perfectly congruent goals, and therefore a contract is developed to specify their relationship. An inability to directly monitor the agent requires the use of performance measures, or metrics, to represent the agent's actions to the principal. The use of multiple measures is necessary given the multi-dimensional nature of successful systems development. In practice such contracts are difficult to develop satisfactorily, due in part to an inability to specify appropriate metrics. This paper develops a principal-agent model that provides a set of decision criteria for the principal to use in developing an incentive-compatible contract for the agent. These criteria include the precision and the sensitivity of the performance metric. After presenting the formal model, some current software development metrics are discussed to illustrate how the model can be used to provide a theoretical foundation and a formal vocabulary for performance metric analysis. The model is also used in a positive (descriptive) manner to explain why current practice emphasizes metrics that possess relatively high levels of sensitivity and precision. Finally, some suggestions are made for the improvement of current metrics based upon these criteria.
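The roles of precision and sensitivity can be illustrated with a textbook linear-contract (LEN) calculation; this is an expository device consistent with the criteria described above, not the paper's exact model.

```latex
% Textbook LEN illustration (assumed, not the paper's model). A metric m
% measures developer effort e with sensitivity mu and noise eps ~ N(0, sigma^2):
%   m = \mu e + \varepsilon .
% The developer is paid a linear wage w = \alpha + \beta m, has CARA risk
% aversion r and effort cost e^2/2, so chooses e = \beta \mu. Picking \beta to
% maximize expected surplus  b e - e^2/2 - (r/2)\beta^2\sigma^2  (with b the
% marginal value of effort) yields
\beta^{*} = \frac{b\,\mu}{\mu^{2} + r\,\sigma^{2}} .
```

The optimal incentive weight thus rises with the metric's precision (1/σ²), in line with the paper's descriptive observation that practice emphasizes metrics with relatively high sensitivity and precision.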